Kullback–Leibler divergence

Results: 486

#  Item
191  Estimation theory / Normal distribution / Histogram of oriented gradients / Kullback–Leibler divergence / Maximum likelihood / Laplace distribution / Variance / One-shot learning / Statistics / Statistical theory / Computer vision

Heavy-tailed Distances for Gradient Based Image Descriptors. Yangqing Jia and Trevor Darrell, UC Berkeley EECS and ICSI. {jiayq,trevor}@eecs.berkeley.edu

Source URL: www.eecs.berkeley.edu

Language: English - Date: 2013-10-18 17:47:37
192  Estimation theory / Bayesian statistics / M-estimators / Maximum likelihood / Mutual information / Principle of maximum entropy / Conditional entropy / Entropy / Kullback–Leibler divergence / Statistics / Information theory / Statistical theory

Log-Linear Models. Noah A. Smith∗, Department of Computer Science / Center for Language and Speech Processing, Johns Hopkins University. [removed]

Source URL: www.cs.cmu.edu

Language: English - Date: 2006-11-30 01:37:48
193  Probability and statistics / Thermodynamics / Statistical theory / Statistical mechanics / Philosophy of thermal and statistical physics / Entropy / Tsallis entropy / Q-exponential distribution / Kullback–Leibler divergence / Information theory / Thermodynamic entropy / Statistics

Norm-induced entropies for decision forests. Christoph Lassner and Rainer Lienhart, Multimedia Computing and Computer Vision Lab, University of Augsburg. [removed]

Source URL: www.multimedia-computing.de

Language: English - Date: 2014-11-14 11:06:55
194  Mathematics / Communication / Data compression / Rate–distortion theory / Noisy-channel coding theorem / Data / Sturm–Liouville theory / Kullback–Leibler divergence / Information theory / Information / Coding theory

Nonasymptotic noisy lossy source coding. Victoria Kostina and Sergio Verdú, Dept. of Electrical Engineering, Princeton University, NJ 08544, USA. Abstract: This paper shows new general nonasymptotic achievability and converse…

Source URL: www.princeton.edu

Language: English - Date: 2013-12-03 13:59:41
195  M-estimators / Kullback–Leibler divergence / Thermodynamics / Information theory / Importance sampling / Jensen–Shannon divergence / Statistics / Estimation theory / Statistical theory

Universal Estimation of Divergence for Continuous Distributions via Data-Dependent Partitions. Qing Wang, Sanjeev R. Kulkarni, and Sergio Verdú, Department of Electrical Engineering, Princeton University, Princeton, NJ [removed], USA

Source URL: www.princeton.edu

Language: English - Date: 2005-12-05 13:40:41
196  Combinatorics / Integer sequences / Factorial / Number theory / Kullback–Leibler divergence / Estimation theory / Gamma distribution / Binomial coefficient / Mathematics / Statistics / Mathematical analysis

Best Arm Identification in Multi-Armed Bandits. Jean-Yves Audibert, Imagine, Université Paris Est & Willow, CNRS/ENS/INRIA, Paris, France

Source URL: www.princeton.edu

Language: English - Date: 2011-06-28 04:16:37
197  Information / Coding theory / Statistical theory / Randomness / Philosophy of thermal and statistical physics / Entropy / Kullback–Leibler divergence / Noisy-channel coding theorem / Quantities of information / Information theory / Statistics / Mathematics

CHAPTER 6: Shannon entropy. This chapter is a digression into information theory. It is a fascinating subject, which arose once the notion of information became precise and quantifiable. From a physical point of view, inform…

Source URL: www.ueltschi.org

Language: English - Date: 2006-11-29 16:35:20
198  Information theory / Hierarchical clustering / Kullback–Leibler divergence / Entropy / Maximum likelihood / Cross entropy / Expectation–maximization algorithm / Cluster analysis / Determining the number of clusters in a data set / Statistics / Statistical theory / Principle of maximum entropy

Distributional Clustering of English Words. Fernando Pereira, AT&T Bell Laboratories. Proceedings of ACL ’93, pp. [removed]

Source URL: www.cs.cornell.edu

Language: English - Date: 2002-08-29 18:38:04
199  Data compression / Rate–distortion theory / Data / Kullback–Leibler divergence / Information theory / Information / Communication

A new converse in rate-distortion theory. Victoria Kostina and Sergio Verdú, Dept. of Electrical Engineering, Princeton University, NJ 08544, USA. Abstract: This paper shows new finite-blocklength converse bounds applicable…

Source URL: www.princeton.edu

Language: English - Date: 2012-06-04 16:38:19
200  Internet protocols / Internet standards / Routing protocols / Matrix / Inverse problem / Routing / Kullback–Leibler divergence / Normal distribution / Regularization / Internet / Statistics / Network architecture

An Information-Theoretic Approach to Traffic Matrix Estimation. Yin Zhang, Matthew Roughan, Carsten Lund, and David Donoho

Source URL: conferences.sigcomm.org

Language: English - Date: 2003-07-24 11:26:26